
feat: add Avian as an LLM provider #3663

Open
avianion wants to merge 4 commits into QuivrHQ:main from avianion:feat/add-avian-provider

Conversation

@avianion

Summary

  • Add Avian as a supported LLM supplier in quivr-core
  • Avian provides an OpenAI-compatible API (chat completions, streaming, function calling) at https://api.avian.io/v1
  • Uses ChatOpenAI from langchain-openai since the API is fully OpenAI-compatible — no new dependencies required

Models

| Model | Context | Max Output | Input $/1M | Output $/1M |
|---|---|---|---|---|
| deepseek/deepseek-v3.2 | 164K | 65K | $0.26 | $0.38 |
| moonshotai/kimi-k2.5 | 131K | 8K | $0.45 | $2.20 |
| z-ai/glm-5 | 131K | 16K | $0.30 | $2.55 |
| minimax/minimax-m2.5 | 1M | 1M | $0.30 | $1.10 |
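For readers comparing models, the per-1M-token rates above translate into request cost straightforwardly. A minimal sketch (the `estimate_cost` helper is hypothetical, not part of quivr-core):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_per_1m: float, output_per_1m: float) -> float:
    """Return the USD cost of one request at the given per-million-token rates."""
    return (input_tokens * input_per_1m + output_tokens * output_per_1m) / 1_000_000

# deepseek/deepseek-v3.2 at $0.26 input / $0.38 output per 1M tokens,
# for a request with 100K input tokens and 10K output tokens:
cost = estimate_cost(100_000, 10_000, 0.26, 0.38)  # about $0.0298
```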

Changes

  • Add AVIAN to DefaultModelSuppliers enum
  • Add Avian model configurations with context/output token limits to LLMModelConfig
  • Add AVIAN supplier handling in LLMEndpoint.from_config() (uses ChatOpenAI with default base URL https://api.avian.io/v1)
  • Add unit test for Avian endpoint configuration
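The enum and model-config additions listed above follow a common shape. A standalone sketch of that shape (names mirror the PR description; the dict values come from the model table, but the exact structure inside quivr-core may differ):

```python
from enum import Enum

class DefaultModelSuppliers(str, Enum):
    OPENAI = "openai"   # existing entries elided for brevity
    AVIAN = "avian"     # new supplier added by this PR

# Per-model context/output limits as described in the PR's model table.
AVIAN_MODELS = {
    "deepseek/deepseek-v3.2": {"context": 164_000, "max_output": 65_000},
    "moonshotai/kimi-k2.5": {"context": 131_000, "max_output": 8_000},
    "z-ai/glm-5": {"context": 131_000, "max_output": 16_000},
}

AVIAN_BASE_URL = "https://api.avian.io/v1"
```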

Usage

```python
from quivr_core.rag.entities.config import DefaultModelSuppliers, LLMEndpointConfig
from quivr_core.llm import LLMEndpoint

config = LLMEndpointConfig(
    supplier=DefaultModelSuppliers.AVIAN,
    model="deepseek/deepseek-v3.2",
    llm_api_key="your-avian-api-key",  # or set AVIAN_API_KEY env var
)
llm = LLMEndpoint.from_config(config)
```
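The dispatch in `LLMEndpoint.from_config()` described above amounts to base-URL selection per supplier. An illustrative, self-contained sketch of that logic (the `resolve_base_url` helper is hypothetical; quivr-core's actual implementation passes the URL into `ChatOpenAI` from langchain-openai):

```python
from typing import Optional

AVIAN_BASE_URL = "https://api.avian.io/v1"

def resolve_base_url(supplier: str, override: Optional[str] = None) -> str:
    """Pick the API base URL for a supplier, honoring an explicit override."""
    if override:
        return override
    if supplier == "avian":
        # Avian's OpenAI-compatible endpoint, as added by this PR.
        return AVIAN_BASE_URL
    return "https://api.openai.com/v1"
```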

Test plan

  • Unit test added: test_llm_endpoint_avian verifies correct ChatOpenAI instantiation, base URL, and token limits

cc @StanGirard @chloedia

Add Avian (https://api.avian.io) as a supported LLM supplier in
quivr-core. Avian provides an OpenAI-compatible API with access to
models like DeepSeek-V3.2, Kimi-K2.5, GLM-5, and MiniMax-M2.5.

- Add AVIAN to DefaultModelSuppliers enum
- Add model configs with context/output token limits
- Add AVIAN supplier handling in LLMEndpoint using ChatOpenAI
  (OpenAI-compatible, defaults to https://api.avian.io/v1)
- Add test for Avian endpoint configuration
@dosubot added labels size:M (This PR changes 30-99 lines, ignoring generated files) and area: backend (Related to backend functionality or under the /backend directory) on Feb 27, 2026
avianion and others added 3 commits March 3, 2026 19:47
… path issue

Update megaparse-sdk from 0.1.10 to 0.1.11 in docs lock files to satisfy
quivr-core's requirement of megaparse-sdk>=0.1.11. Also fix the
${PROJECT_ROOT} variable in the RTD build commands which was not being
resolved by uv, causing it to fall back to PyPI instead of using the
local quivr-core source. Fixed mkdocs configuration path to point to
the correct docs/mkdocs.yml.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
The previous approach of using sed to fix the file:// path in the
lockfile didn't work because uv pip install can't resolve local file://
references in lockfiles. Instead, install quivr-core directly from the
local ./core directory first, then install the remaining dependencies
from the lock file (filtered to exclude the quivr-core line).
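The filtering step that commit describes can be sketched in a few lines. This is an illustrative stand-in (`filter_lock_lines` is hypothetical, not code from the PR): drop the quivr-core requirement from the lockfile's lines before installing the rest, since the local copy is installed separately.

```python
def filter_lock_lines(lines: list[str], package: str = "quivr-core") -> list[str]:
    """Keep every requirement line that does not reference the given package."""
    return [ln for ln in lines if not ln.strip().startswith(package)]

# Example lockfile contents (illustrative):
lock = ["quivr-core @ file:///core", "httpx==0.27.0", "pydantic==2.7.0"]
remaining = filter_lock_lines(lock)  # quivr-core line removed
```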

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
The uv-created venv doesn't include pip, so use 'uv pip install'
directly instead of '.venv/bin/python -m pip install'.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>